Improving Access to CalFresh

Key Findings

This analysis draws on 2,046 CalFresh applications submitted online in San Diego County. It includes applicant details, online activity, and final approval outcomes — but does not capture actions taken outside the platform, such as mailed documents or phone interviews.

1. Factors Associated With Approval

To identify what drives approval, I fit a logistic regression model using variables selected for their program relevance, observed user behavior, and patterns found in exploratory analysis. The goal was not just to predict outcomes, but to understand which steps in the process matter most — and where applicants might drop off.
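As a rough illustration of this setup, the sketch below fits a one-variable logistic regression by plain gradient ascent on simulated toy data — not the study's dataset. The `fit_logistic` helper and the simulated rates (72% vs. 50% interview effect, echoing the findings below) are illustrative assumptions; a real analysis would use a statistics library.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Fit intercept + weights by batch gradient ascent on the log-likelihood."""
    n, k = len(X), len(X[0])
    w = [0.0] * (k + 1)                     # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj + lr * g / n for wj, g in zip(w, grad)]
    return w

# Toy data echoing one reported pattern: approval is more likely
# when the interview is confirmed (72% vs. 50%).
random.seed(0)
X, y = [], []
for _ in range(500):
    interview = random.random() < 0.5
    X.append([1.0 if interview else 0.0])
    y.append(1 if random.random() < (0.72 if interview else 0.50) else 0)

w = fit_logistic(X, y)
interview_odds_ratio = math.exp(w[1])       # exp(coef) gives the odds ratio
print(round(interview_odds_ratio, 2))       # well above 1
```

Exponentiating a fitted coefficient yields the odds ratio reported in the results table further down.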

The model included:

  • Income (scaled in $500 increments)
  • Household size and presence of children or older adults
  • Document uploads (submitted with the application and after submission)
  • Application completion time
  • Housing stability
  • Interview completion (self-reported)

For interpretability, income was scaled in $500 increments. Application time was capped at 180 minutes to reduce the influence of extreme outliers. Interview completion was recoded into a two-level variable to include missing responses as “Not confirmed,” capturing drop-off behavior.
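These three transformations can be sketched as a small helper (the function name is hypothetical; it simply applies the rules described above):

```python
def prepare_features(income_monthly, app_minutes, interview_response):
    """Apply the three preparation steps described above."""
    income_scaled = income_monthly / 500.0      # income in $500 increments
    minutes_capped = min(app_minutes, 180)      # cap extreme outliers at 180 min
    # Fold missing interview responses into "Not confirmed"
    interview = ("Completed" if interview_response == "Completed"
                 else "Not confirmed")
    return income_scaled, minutes_capped, interview

print(prepare_features(2000, 240, None))        # → (4.0, 180, 'Not confirmed')
```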

Key Findings:

  • Interview completion was the strongest predictor: Applicants who confirmed completing the interview had a 72% average predicted approval rate, versus 50% for others.
  • Uploading documents with the application was linked to higher approval odds — about 13% higher per document, on average.
  • Higher income reduced the odds of approval: Each additional $500 in monthly income decreased approval odds by roughly one-third.
  • Many income-eligible applicants were not approved: Nearly half of those who met income criteria were denied, often due to missing interviews or documents. Income eligibility was assessed using CalFresh income limits based on household size.
  • Applicants with children had higher approval odds, even after controlling for household size.
  • ZIP code mattered: Approval rates varied significantly across ZIPs, suggesting geographic differences in access or processing.

The table below shows model results using odds ratios — a way to estimate how each factor affects the odds of approval, controlling for all others. For example, an odds ratio of 1.5 means the odds of approval are 50% higher for that group compared to the baseline.

Variable                           Odds Ratio   95% CI       P-Value   Interpretation
Baseline (reference)               2.02         1.51–2.71    0.00      Baseline odds (intercept)
Income (per $500)                  0.67         0.62–0.71    0.00      Higher income was linked to lower approval
Household Size                     0.83         0.68–1.01    0.07      Larger households were not significantly different
Children in Household              1.46         1.13–1.89    0.00      Each child increased odds of approval
Older Adults in Household          1.20         0.85–1.70    0.29      No significant effect
Docs Submitted With Application    1.13         1.07–1.19    0.00      Each document increased approval odds by ~13%
Docs Submitted After Application   1.07         1.01–1.14    0.02      Each document had a small positive effect
Application Time (minutes)         1.00         0.99–1.00    0.20      Longer applications showed no strong association
Stable Housing                     0.90         0.70–1.15    0.38      No clear difference after accounting for other factors
Interview Completed                2.99         2.31–3.89    0.00      Strongest predictor of approval
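Odds ratios multiply: the baseline odds times the ratios for an applicant's characteristics give their predicted odds, which convert to a probability via p = odds / (1 + odds). A minimal sketch using the intercept and two ratios from the table (`odds_to_prob` is a hypothetical helper; all other covariates sit at their reference levels, so the result is illustrative rather than a population rate):

```python
def odds_to_prob(odds):
    """Convert odds to a probability: p = odds / (1 + odds)."""
    return odds / (1.0 + odds)

BASELINE_ODDS = 2.02    # intercept row of the table
OR_INTERVIEW = 2.99     # interview completed
OR_DOC_WITH = 1.13      # per document submitted with the application

# Reference-level applicant who completed the interview and
# uploaded three documents with the application:
odds = BASELINE_ODDS * OR_INTERVIEW * OR_DOC_WITH ** 3
print(round(odds_to_prob(odds), 2))   # → 0.9
```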

The model explained a meaningful amount of variation in approval outcomes (McFadden R² = 0.15) and ranked a randomly chosen approved application above a randomly chosen denied one 76% of the time (AUC = 0.76). These results suggest the model performs well given the limited behavioral data available from the application platform.
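The AUC has a concrete reading: it is the probability that a randomly chosen approved application receives a higher predicted score than a randomly chosen denied one. A small sketch computes it by comparing every approved/denied pair (the scores here are toy values, not the model's actual predictions):

```python
def auc_by_pairs(scores_pos, scores_neg):
    """Probability a positive outranks a negative; ties count half."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy predicted-approval scores
approved = [0.9, 0.8, 0.7, 0.6]
denied = [0.75, 0.5, 0.4, 0.3]
print(auc_by_pairs(approved, denied))   # → 0.875
```

An AUC of 0.5 would mean the model ranks no better than chance; 1.0 would mean perfect separation.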

To make the results easier to interpret, the table below shows predicted approval rates for example applicant scenarios, based on the fitted model.

Scenario                                      Predicted Approval Rate
Completed interview + uploaded docs           84%
No interview, no docs                         58%
Income > $1,500, small household              30%
Income < $500, submitted docs + interview     82%

These results show that relatively small steps — like completing an interview or uploading a document — can meaningfully increase the chance of approval. Many of these steps could be supported through timely nudges, simpler workflows, or user-centered reminders.

See the full analysis walkthrough for details on variable preparation, modeling assumptions, and diagnostics.

2. Potential Improvements

The model points to clear opportunities to improve approval outcomes by supporting applicants at key decision points.

Recommendations:

  • Support interview completion: Many eligible applicants do not confirm completing the interview. Providing reminders, real-time scheduling, or alternative follow-up methods could increase follow-through.
  • Encourage early document uploads: Uploading documents with the application was strongly associated with approval. Nudging users to upload early — especially those likely to qualify — could reduce denials.
  • Address geographic disparities: Approval rates vary significantly by ZIP code. Further analysis could explore whether these reflect staffing, broadband access, or population needs — and help inform place-based outreach strategies.

Strengthening these steps would not only increase approval rates, but also improve equity and access for those most in need.